Stable Irregular Dynamics in Complex Neural Networks
For infinitely large sparse networks of spiking neurons mean field theory
shows that a balanced state of highly irregular activity arises under various
conditions. Here we analytically investigate the microscopic irregular dynamics
in finite networks of arbitrary connectivity, keeping track of all individual
spike times. For delayed, purely inhibitory interactions we demonstrate that
the irregular dynamics is not chaotic but rather stable and convergent towards
periodic orbits. Moreover, every generic periodic orbit of these dynamical
systems is stable. These results highlight that chaotic and stable dynamics are
equally capable of generating irregular activity.
Comment: 10 pages, 2 figures
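As a rough illustration of the kind of network discussed above, the sketch below simulates a toy leaky integrate-and-fire network with delayed, purely inhibitory pulse coupling. All parameter values, the coupling scheme, and the model details are illustrative assumptions, not the model analyzed in the paper.

```python
import random

def simulate_lif_network(n=50, k=5, steps=5000, dt=0.1, tau=20.0,
                         i_ext=1.2, j_inh=0.1, delay=10, seed=0):
    """Toy LIF network with delayed inhibitory pulse coupling.
    Parameters are illustrative, not taken from the paper."""
    rng = random.Random(seed)
    targets = [rng.sample(range(n), k) for _ in range(n)]  # k random targets each
    v = [rng.random() for _ in range(n)]                   # membrane potentials
    buf = [[0.0] * n for _ in range(delay)]                # delayed-inhibition ring buffer
    spikes = []
    for t in range(steps):
        slot = t % delay
        arriving = buf[slot]             # kicks scheduled `delay` steps ago
        buf[slot] = [0.0] * n
        for i in range(n):
            # leaky integration toward i_ext, minus delayed inhibition
            v[i] += dt / tau * (i_ext - v[i]) - arriving[i]
        for i in range(n):
            if v[i] >= 1.0:              # threshold crossing
                v[i] = 0.0               # reset
                spikes.append((t * dt, i))
                for j in targets[i]:     # schedule delayed inhibitory kicks
                    buf[slot][j] += j_inh
    return v, spikes
```

With a suprathreshold drive, the network keeps spiking despite the inhibition, and spike times can be collected for irregularity statistics.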
Using the uncertainty principle to design simple interactions for targeted self-assembly
We present a method that systematically simplifies isotropic interactions designed for targeted self-assembly. The uncertainty principle is used to show that an optimal simplification is achieved by a combination of heat kernel smoothing and Gaussian screening of the interaction potential in real and reciprocal space. We use this method to analytically design isotropic interactions for self-assembly of complex lattices and of materials with functional properties. The derived interactions are simple enough to narrow the gap between theory and experimental implementation of theory-based designs of self-assembling materials.
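The two operations named above can be sketched numerically: heat kernel smoothing is a Gaussian convolution (low-pass filtering in reciprocal space), and Gaussian screening is multiplication by a real-space Gaussian envelope. The widths sigma_k and sigma_r below are free illustration choices, not values from the paper.

```python
import numpy as np

def simplify_potential(v, dx, sigma_r, sigma_k):
    """Heat-kernel smoothing plus Gaussian screening of a tabulated
    isotropic potential v(r) on a uniform grid of spacing dx.
    Illustrative sketch only."""
    r = np.arange(len(v)) * dx
    # heat-kernel smoothing: convolve with a normalized Gaussian
    kr = np.arange(-4 * sigma_k, 4 * sigma_k + dx, dx)
    kernel = np.exp(-kr**2 / (2 * sigma_k**2))
    kernel /= kernel.sum()
    smoothed = np.convolve(v, kernel, mode="same")
    # Gaussian screening: suppress the long-range tail in real space
    return smoothed * np.exp(-r**2 / (2 * sigma_r**2))
```

Applied to an oscillatory potential, the result is visibly smoother (lower total variation) and short-ranged.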
Separation of Trajectories and its Relation to Entropy for Intermittent Systems with a Zero Lyapunov Exponent
One-dimensional intermittent maps with stretched-exponential separation of
nearby trajectories are considered. As time goes to infinity the standard
Lyapunov exponent is zero. We investigate the distribution of
\lambda_\alpha = t^{-\alpha} \sum_{i=0}^{t-1} \ln \left| M'(x_i) \right| ,
where \alpha is determined by the nonlinearity of the map in the vicinity of
marginally unstable fixed points. The mean of \lambda_\alpha is determined
by the infinite invariant density. Using semi-analytical arguments we calculate
the infinite invariant density for the Pomeau-Manneville map, and with it
obtain excellent agreement between numerical simulation and theory. We show
that \alpha \left\langle \lambda_\alpha \right\rangle is equal to Krengel's entropy and
to the complexity calculated by the Lempel-Ziv compression algorithm. This
generalized Pesin's identity shows that \left\langle \lambda_\alpha \right\rangle and
Krengel's entropy are the natural generalizations of the usual Lyapunov exponent
and entropy for these systems.
Comment: 12 pages, 10 figures
Direct transition to high-dimensional chaos through a global bifurcation
In the present work we report on a genuine route by which a high-dimensional
(with d>4) chaotic attractor is created directly, i.e., without a
low-dimensional chaotic attractor as an intermediate step. The high-dimensional
chaotic set is created in a heteroclinic global bifurcation that yields an
infinite number of unstable tori. The mechanism is illustrated using a system
constructed by coupling three Lorenz oscillators. Thus the route presented here
can be considered a prototype for high-dimensional chaotic behavior, just as the
Lorenz model is for low-dimensional chaos.
Comment: 7 pages
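Three coupled Lorenz oscillators can be integrated as follows. The ring coupling through the x variables and the strength eps are illustrative assumptions, not the specific construction used in the paper.

```python
import numpy as np

def coupled_lorenz(eps=0.3, steps=20_000, dt=0.005,
                   sigma=10.0, rho=28.0, beta=8.0/3.0):
    """Three diffusively ring-coupled Lorenz oscillators, RK4 integration.
    Coupling scheme and parameters are illustrative choices."""
    def f(u):
        u = u.reshape(3, 3)            # rows: oscillators; columns: x, y, z
        du = np.empty_like(u)
        for i in range(3):
            x, y, z = u[i]
            xn = u[(i + 1) % 3, 0]     # neighbor's x for ring coupling
            du[i, 0] = sigma * (y - x) + eps * (xn - x)
            du[i, 1] = x * (rho - z) - y
            du[i, 2] = x * y - beta * z
        return du.ravel()
    u = np.array([1.0, 1.0, 1.0, 1.1, 0.9, 1.0, 0.9, 1.1, 1.0])
    traj = []
    for _ in range(steps):
        k1 = f(u); k2 = f(u + dt/2*k1); k3 = f(u + dt/2*k2); k4 = f(u + dt*k3)
        u = u + dt/6*(k1 + 2*k2 + 2*k3 + k4)
        traj.append(u.copy())
    return np.array(traj)
```

The nine-dimensional trajectory remains bounded on the coupled attractor, and its dimensionality can then be probed with standard Lyapunov-spectrum tools.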
Quantifying Self-Organization with Optimal Predictors
Despite broad interest in self-organizing systems, there are few
quantitative, experimentally-applicable criteria for self-organization. The
existing criteria all give counter-intuitive results for important cases. In
this Letter, we propose a new criterion, namely an internally-generated
increase in the statistical complexity, the amount of information required for
optimal prediction of the system's dynamics. We precisely define this
complexity for spatially-extended dynamical systems, using the probabilistic
ideas of mutual information and minimal sufficient statistics. This leads to a
general method for predicting such systems, and a simple algorithm for
estimating statistical complexity. The results of applying this algorithm to a
class of models of excitable media (cyclic cellular automata) strongly support
our proposal.
Comment: Four pages, two color figures
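The idea of statistical complexity as "information required for optimal prediction" can be sketched crudely for a binary sequence: group length-k histories whose empirical next-symbol distributions agree within a tolerance (a rough stand-in for minimal sufficient statistics / causal states), then take the entropy of the resulting state occupation. This is a sketch of the idea only, not the authors' algorithm; k and tol are arbitrary choices.

```python
import math, random
from collections import defaultdict

def statistical_complexity(seq, k=3, tol=0.1):
    """Crude statistical-complexity estimate for a binary sequence:
    merge histories with similar predictive distributions, then return
    the Shannon entropy (bits) of the merged-state weights."""
    counts = defaultdict(lambda: [0, 0])   # history -> [#next=0, #next=1]
    for i in range(len(seq) - k):
        counts[tuple(seq[i:i + k])][seq[i + k]] += 1
    states = []                            # [p(next=1), total weight]
    for c0, c1 in counts.values():
        p1, w = c1 / (c0 + c1), c0 + c1
        for s in states:
            if abs(s[0] - p1) < tol:       # same predictive state
                s[1] += w
                break
        else:
            states.append([p1, w])
    total = sum(w for _, w in states)
    return -sum(w/total * math.log2(w/total) for _, w in states)
```

An i.i.d. fair coin needs no memory for optimal prediction (one state, complexity near zero), while a period-2 sequence needs one bit of phase information.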
Transmission of Information in Active Networks
Shannon's Capacity Theorem is the central concept of the theory of
communication. It states that if the amount of information contained in a signal
is smaller than the channel capacity of the physical medium of communication, it
can be transmitted with arbitrarily small probability of error. The theorem
usually applies to ideal channels of communication, in which the information
to be transmitted does not alter the passive characteristics of the channel,
which basically tries to reproduce the source of information. For an {\it active
channel}, a network formed by elements that are dynamical systems (such as
neurons, or chaotic or periodic oscillators), it is unclear whether the theorem
applies, since an active channel can adapt to the input of a signal, altering
its capacity. To shed light on this matter, we show, among other results, how
to calculate the information capacity of an active channel of communication.
Then, we show that the {\it channel capacity} depends on whether the active
channel is self-excitable or not and that, contrary to a current belief,
desynchronization can provide an environment in which large amounts of
information can be transmitted in a channel that is self-excitable. An
interesting case of a self-excitable active channel is a network of
electrically connected Hindmarsh-Rose chaotic neurons.
Comment: 15 pages, 5 figures. Submitted for publication; to appear in Phys.
Rev.
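The basic quantity behind such capacity arguments, the mutual information between a channel's input and output, can be estimated empirically from symbol counts. The sketch below is a minimal illustration on a binary symmetric channel, not the paper's capacity calculation for active channels.

```python
import math, random

def mutual_information_bits(xs, ys):
    """Empirical mutual information (bits) between two binary sequences,
    computed from joint symbol counts. Minimal sketch only."""
    n = len(xs)
    joint = {}
    for x, y in zip(xs, ys):
        joint[(x, y)] = joint.get((x, y), 0) + 1
    px = {x: sum(c for (a, _), c in joint.items() if a == x) / n for x in (0, 1)}
    py = {y: sum(c for (_, b), c in joint.items() if b == y) / n for y in (0, 1)}
    mi = 0.0
    for (x, y), c in joint.items():
        p = c / n
        mi += p * math.log2(p / (px[x] * py[y]))  # sum p(x,y) log p(x,y)/p(x)p(y)
    return mi
```

For a fair binary source passed through a channel that flips each bit with probability 0.1, the true mutual information is 1 - H(0.1), about 0.53 bits per symbol.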
Power laws of complex systems from Extreme physical information
Many complex systems obey allometric, or power, laws y=Yx^{a}. Here y is the
measured value of some system attribute a, Y is a constant, and x is a
stochastic variable. Remarkably, for many living systems the exponent a is
limited to values ±n/4, n = 0, 1, 2, ... Here x is the mass of a randomly
selected creature in the population. These quarter-power laws hold for many
attributes, such as pulse rate (n = -1). Allometry has, in the past, been
theoretically justified on a case-by-case basis. An ultimate goal is to find a
common cause for allometry of all types and for both living and nonliving
systems. The principle I - J = extrem. of Extreme physical information (EPI) is
found to provide such a cause. It describes the flow of Fisher information J =>
I from an attribute value a on the cell level to its exterior observation y.
Data y are formed via a system channel function y = f(x,a), with f(x,a) to be
found. Extremizing the difference I - J through variation of f(x,a) results in
a general allometric law f(x,a)= y = Yx^{a}. Darwinian evolution is presumed to
cause a second extremization of I - J, now with respect to the choice of a. The
solution is a = ±n/4, n = 0, 1, 2, ..., defining the particular powers of
biological allometry. Under special circumstances, the model predicts that such
biological systems are controlled by only two distinct intracellular information
sources. These sources are conjectured to be cellular DNA and the cellular
transmembrane ion gradient.
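In practice, an allometric exponent a in y = Y x^a is estimated by linear regression in log-log coordinates. This is the standard technique; the quarter-power data below are synthetic, not from the paper.

```python
import numpy as np

def fit_allometric_exponent(x, y):
    """Estimate the exponent a and prefactor Y of y = Y * x**a by
    least-squares regression of log(y) against log(x)."""
    slope, intercept = np.polyfit(np.log(x), np.log(y), 1)
    return slope, np.exp(intercept)
```

For noise-free synthetic data obeying a 3/4 power law, the fit recovers the exponent essentially exactly.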
Exact equations and scaling relations for the f-avalanche in the Bak-Sneppen evolution model
An infinite hierarchy of exact equations is derived for the newly observed
f-avalanche in the Bak-Sneppen evolution model. By solving the first-order
exact equation, we find that the critical exponent governing the divergence
of the average avalanche size is exactly 1 (in all dimensions), as confirmed
by simulations. Solution of the gap equation yields another universal
exponent, governing the relaxation to the attractor, which is also exactly 1.
We also establish some scaling relations among the critical exponents of the
new avalanche.
Comment: 5 pages, 1 figure
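The Bak-Sneppen model itself is simple to simulate: at each update, the least-fit site and its two neighbors receive fresh random fitnesses, and the gap G (the running maximum of the minimal fitness) relaxes toward the critical threshold. The system size and update count below are illustrative choices.

```python
import random

def bak_sneppen(n=64, updates=20_000, seed=4):
    """Minimal 1D Bak-Sneppen evolution model with periodic boundaries,
    tracking the gap (running maximum of the minimal fitness)."""
    rng = random.Random(seed)
    f = [rng.random() for _ in range(n)]
    gap = 0.0
    gaps = []
    for _ in range(updates):
        i = min(range(n), key=f.__getitem__)   # least-fit site
        gap = max(gap, f[i])                   # gap is nondecreasing by definition
        for j in (i - 1, i, (i + 1) % n):      # replace site and its neighbors
            f[j] = rng.random()
        gaps.append(gap)
    return f, gaps
```

The gap sequence is monotone and rises well above its initial value as the system self-organizes toward criticality.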
The Computational Complexity of Symbolic Dynamics at the Onset of Chaos
In a variety of studies of dynamical systems, the edge of order and chaos has
been singled out as a region of complexity. It was suggested by Wolfram, on the
basis of qualitative behaviour of cellular automata, that the computational
basis for modelling this region is the Universal Turing Machine. In this paper,
following a suggestion of Crutchfield, we try to show that the Turing machine
model may often be too powerful as a computational model to describe the
boundary of order and chaos. In particular we study the region of the first
accumulation of period doubling in unimodal and bimodal maps of the interval,
from the point of view of language theory. We show that in relation to the
``extended'' Chomsky hierarchy, the relevant computational model in the
unimodal case is the nested stack automaton or the related indexed languages,
while the bimodal case is modeled by the linear bounded automaton or the
related context-sensitive languages.
Comment: 1 reference corrected, 1 reference added, minor changes in body of
manuscript
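The symbolic dynamics at the period-doubling accumulation point can be illustrated concretely: the itinerary of the critical point of the logistic map (symbol R right of 1/2, L left of it) is described by the period-doubling substitution R -> RL, L -> RR. The value of r_inf used below is a standard numerical approximation of the unimodal accumulation point, not a quantity defined in the paper.

```python
def itinerary(r, n, x=0.5):
    """Symbolic itinerary of the critical point of f(x) = r*x*(1-x):
    'R' if an iterate lands right of 1/2, else 'L'."""
    s = []
    for _ in range(n):
        x = r * x * (1.0 - x)
        s.append('R' if x > 0.5 else 'L')
    return ''.join(s)

def doubling_substitution(n):
    """First n symbols of the limit word of the period-doubling
    substitution R -> RL, L -> RR."""
    w = 'R'
    while len(w) < n:
        w = ''.join('RL' if c == 'R' else 'RR' for c in w)
    return w[:n]
```

At r close to the accumulation point (r_inf ≈ 3.5699456), the two sequences agree symbol for symbol over their first terms.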
Automatic Filters for the Detection of Coherent Structure in Spatiotemporal Systems
Most current methods for identifying coherent structures in
spatially-extended systems rely on prior information about the form which those
structures take. Here we present two new approaches to automatically filter the
changing configurations of spatial dynamical systems and extract coherent
structures. One, local sensitivity filtering, is a modification of the local
Lyapunov exponent approach suitable to cellular automata and other discrete
spatial systems. The other, local statistical complexity filtering, calculates
the amount of information needed for optimal prediction of the system's
behavior in the vicinity of a given point. By examining the changing
spatiotemporal distributions of these quantities, we can find the coherent
structures in a variety of pattern-forming cellular automata, without needing
to guess or postulate the form of that structure. We apply both filters to
elementary and cyclical cellular automata (ECA and CCA) and find that they
readily identify particles, domains and other more complicated structures. We
compare the results from ECA with earlier ones based upon the theory of formal
languages, and the results from CCA with a more traditional approach based on
an order parameter and free energy. While sensitivity and statistical
complexity are equally adept at uncovering structure, they are based on
different system properties (dynamical and probabilistic, respectively), and
provide complementary information.
Comment: 16 pages, 21 figures. Figures considerably compressed to fit arXiv
requirements; write to the first author for a higher-resolution version.
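The sensitivity-filtering idea above can be sketched for an elementary cellular automaton: flip each cell in turn, evolve both configurations a few steps, and record how many cells end up differing. Since ECA have interaction radius 1, differences stay inside a light cone of radius t. The rule, lattice size, and horizon t are illustrative choices, not the authors' exact filter.

```python
import random

def eca_step(cells, rule=110):
    """One synchronous step of an elementary CA with periodic boundaries."""
    n = len(cells)
    return [(rule >> (cells[(i - 1) % n] * 4 + cells[i] * 2
                      + cells[(i + 1) % n])) & 1
            for i in range(n)]

def local_sensitivity(cells, t=5, rule=110):
    """Toy local-sensitivity filter: perturbation spread after t steps,
    measured per site. Values lie in [0, 2*t + 1] by the light-cone bound."""
    ref = cells
    for _ in range(t):
        ref = eca_step(ref, rule)
    out = []
    for i in range(len(cells)):
        pert = list(cells)
        pert[i] ^= 1                      # flip one cell
        for _ in range(t):
            pert = eca_step(pert, rule)
        out.append(sum(a != b for a, b in zip(ref, pert)))
    return out
```

High values of this field mark regions where perturbations spread, which is the raw material the filtering approach uses to highlight particles and other coherent structures.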